A Japanese large language model with 7.2 billion parameters, developed by the Research and Development Center for Large Language Models at the National Institute of Informatics (NII) in Japan. It is based on the Transformer architecture, supports Japanese, English, Chinese, Korean, and program code, and is released under the Apache 2.0 license.
Natural Language Processing
Transformers
Multiple Languages
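Since the model is tagged for the Transformers library, a minimal generation sketch with Hugging Face Transformers is shown below. The repository ID `llm-jp/llm-jp-3-7.2b` is an assumption for illustration; substitute the actual Hub ID of the released checkpoint.

```python
# Minimal generation sketch using Hugging Face Transformers.
# The repository ID below is an assumption; replace it with the
# actual Hub ID of the released 7.2B-parameter checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "llm-jp/llm-jp-3-7.2b"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 7.2B parameters fit on a single large GPU in bf16
    device_map="auto",
)

prompt = "自然言語処理とは何ですか。"  # "What is natural language processing?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(
        inputs.input_ids,
        max_new_tokens=100,
        do_sample=True,
        top_p=0.95,
        temperature=0.7,
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Because this is a base (pretrained) checkpoint rather than a chat model, plain text completion as above is the appropriate usage pattern; sampling parameters are illustrative defaults.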